
    The Role of Emotion Projection, Sexual Desire, and Self-Rated Attractiveness in the Sexual Overperception Bias

    A consistent finding in the literature is that men overperceive sexual interest in women (i.e., the sexual overperception bias). Several potential mechanisms have been proposed for this bias, including projection of one’s own interest onto a given partner, sexual desire, and self-rated attractiveness. Here, we examined the influence of these factors on attraction-detection accuracy during speed-dates. Sixty-seven participants (34 women), split into four groups, went on a total of 10 speed-dates with all opposite-sex members of their group, resulting in 277 dates. The results showed that attraction-detection accuracy was reliably predicted by projection of own interest in combination with participant sex. Specifically, men were more accurate than women at detecting attraction when they were not interested in their partner, compared to when they were interested. These results are discussed in the wider context of arousal influencing the detection of partner attraction.
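
    As a minimal sketch of how attraction-detection accuracy and projection of own interest might be operationalized from speed-date data (the column names, file name, and the simple accuracy and projection definitions below are illustrative assumptions, not the authors' analysis pipeline):

```python
import pandas as pd

# Hypothetical long-format speed-date data: one row per date, per perceiver.
# Assumed columns: perceiver, sex, own_interest (0/1),
# perceived_partner_interest (0/1), partner_actual_interest (0/1).
dates = pd.read_csv("speed_dates.csv")  # illustrative file name

# Detection accuracy: does the perceiver's judgment of the partner's
# interest match the partner's actual, self-reported interest?
dates["correct"] = (
    dates["perceived_partner_interest"] == dates["partner_actual_interest"]
).astype(int)

# Projection: per perceiver, how strongly own interest tracks the
# perceived interest of the partner (phi coefficient on 0/1 data).
projection = dates.groupby("perceiver").apply(
    lambda g: g["own_interest"].corr(g["perceived_partner_interest"])
)

# Accuracy by perceiver sex and own interest, mirroring the reported
# interaction between projection of interest and participant sex.
print(dates.groupby(["sex", "own_interest"])["correct"].mean())
```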

    Multimodal mate choice: Exploring the effects of sight, sound, and scent on partner choice in a speed-date paradigm

    When people meet a potential partner for the first time, they are confronted with multiple sources of information, spanning different modalities, that they can use to determine whether this partner is suitable for them. While visual attractiveness has been widely studied with regard to partner choice, olfactory and auditory cues have received less attention, even though they may influence the attitudes that people form towards a partner. In this study, we therefore combined pre-date multimodal rating tasks with subsequent speed-date sessions. This offered a naturalistic setup to study partner choice and to disentangle the relative effects of a priori attractiveness ratings of sight, scent, and sound on date success. Visual attractiveness ratings showed a strong positive correlation with the propensity to meet the partner again, while the effects of olfactory and auditory attractiveness were negligible or not robust. Furthermore, we found no robust sex differences in the importance of the three modalities. Our findings underscore the relative importance of visual attractiveness in initial mate choice, but do not corroborate the idea that static pre-date measures of auditory and olfactory attractiveness can predict first-date outcomes.
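
    A minimal sketch of how the relative effects of the three modality ratings on date outcome might be estimated (the column names and the plain logistic model are illustrative assumptions; the actual analysis likely also models rater- and partner-level dependencies):

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical data: one row per date, with standardized pre-date
# attractiveness ratings per modality and a binary decision to meet again.
df = pd.read_csv("speed_date_outcomes.csv")  # illustrative file name

# Logistic regression of the propensity to meet the partner again on the
# a priori visual, olfactory, and auditory attractiveness ratings.
model = smf.logit(
    "meet_again ~ visual_rating + olfactory_rating + auditory_rating",
    data=df,
).fit()
print(model.summary())

# Adding interactions with sex would test for sex differences in how
# heavily each modality is weighted:
# smf.logit("meet_again ~ (visual_rating + olfactory_rating"
#           " + auditory_rating) * sex", data=df)
```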

    People that score high on psychopathic traits are less likely to yawn contagiously

    Considerable variation exists in the contagiousness of yawning, and numerous studies have investigated the proximate mechanisms involved in this response. Yet findings within the psychological literature are mixed, with many studies conducted on relatively small and homogeneous samples. Here, we aimed to replicate and extend research suggesting a negative relationship between psychopathic traits and yawn contagion in community samples. In the largest study of contagious yawning to date (N = 458), which included both university students and community members from across 50 nationalities, participants completed an online study in which they self-reported their yawn contagion to a video stimulus and completed four measures of psychopathy: the primary and secondary psychopathy scales from the Levenson Self-Report Psychopathy Scale (LSRPS), the psychopathy construct from the Dirty Dozen, and the Psychopathic Personality Traits Scale (PPTS). Results support previous findings: participants who yawned contagiously tended to score lower on the combined and primary measures of psychopathy. That said, tiredness was the strongest predictor across all models. These findings align with functional accounts of spontaneous and contagious yawning and with a generalized impairment in overall patterns of behavioral contagion and biobehavioral synchrony among people high in psychopathic traits.

    Scleral pigmentation leads to conspicuous, not cryptic, eye morphology in chimpanzees

    Gaze following has been argued to be uniquely human, facilitated by our depigmented, white sclera [M. Tomasello, B. Hare, H. Lehmann, J. Call, J. Hum. Evol. 52, 314–320 (2007)]—the pale area around the colored iris—and to underpin human-specific behaviors such as language. Today, we know that great apes show diverse patterns of scleral coloration [J. A. Mayhew, J. C. Gómez, Am. J. Primatol. 77, 869–877 (2015); J. O. Perea García, T. Grenzner, G. Hešková, P. Mitkidis, Commun. Integr. Biol. 10, e1264545 (2016)]. We compare scleral coloration and its relative contrast with the iris in bonobos, chimpanzees, and humans. Like those of humans, bonobos’ sclerae are lighter than their irises; chimpanzee sclerae are darker than their irises. The relative contrast between the sclera and iris in all three species is comparable, suggesting a perceptual mechanism that explains recent evidence that nonhuman great apes also rely on gaze as a social cue.
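
    As a minimal sketch of how the relative contrast between sclera and iris could be quantified from image regions (a signed variant of Michelson contrast is used here as an illustrative choice, not necessarily the paper's measure):

```python
import numpy as np

def relative_contrast(sclera: np.ndarray, iris: np.ndarray) -> float:
    """Signed Michelson-style contrast between mean sclera and mean iris
    luminance. Positive: sclera lighter than iris (human/bonobo pattern);
    negative: sclera darker than iris (chimpanzee pattern)."""
    s, i = float(sclera.mean()), float(iris.mean())
    return (s - i) / (s + i)

# Illustrative grayscale pixel samples (0-255) from hypothetical eye regions.
human_like = relative_contrast(np.array([230, 225, 240]), np.array([90, 85, 95]))
chimp_like = relative_contrast(np.array([60, 55, 70]), np.array([120, 115, 125]))
print(f"human-like: {human_like:+.2f}, chimpanzee-like: {chimp_like:+.2f}")
```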

    Individual attractiveness preferences differentially modulate immediate and voluntary attention

    Physical attractiveness plays a crucial role in mate choice for both men and women. This is reflected in visual attention: people immediately attend to, and look longer at, attractive faces, especially when they are motivated to find a partner. However, previous studies did not incorporate real-life dating decisions. Here, we combined attentional tasks with individual attractiveness ratings and a real-life mate-choice context, namely a speed-dating paradigm. We investigated whether heterosexual, non-committed young adults showed biases in immediate and voluntary attention towards attractive faces and preferred dating partners. In line with previous research, we found considerable individual differences in attractiveness preferences. Furthermore, our results showed that men had a bias towards attractive faces and preferred dating partners in the immediate attention task, while results for women were mixed. In the voluntary attention task, however, both men and women had an attentional bias towards attractive faces and preferred dating partners. Our results suggest that individual attractiveness preferences are good predictors of voluntary attention in particular. We discuss these findings from an evolutionary perspective and suggest directions for future research.
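
    A minimal sketch of how an attentional bias score might be computed from reaction times in a probe-style task (the dot-probe logic and the variable names are assumptions; the study's actual immediate- and voluntary-attention tasks may have been operationalized differently):

```python
import numpy as np

def bias_score(rt_probe_at_attractive: np.ndarray,
               rt_probe_at_other: np.ndarray) -> float:
    """Attentional bias in ms: positive values mean faster responses when
    the probe replaces the attractive (or preferred) face, i.e., attention
    was already allocated there."""
    return float(rt_probe_at_other.mean() - rt_probe_at_attractive.mean())

# Illustrative reaction times (ms) for one participant.
rng = np.random.default_rng(0)
at_attractive = rng.normal(480, 40, size=40)  # probe at attractive face
at_other = rng.normal(505, 40, size=40)       # probe at the other face
print(f"attentional bias: {bias_score(at_attractive, at_other):+.1f} ms")
```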

    A comparative framework of inter-individual coordination and pair-bonding

    Inter-individual coordination (IIC) at the behavioral and physiological level, and its association with courtship and pair-bond maintenance, have received increasing attention in the scientific literature in recent years. However, there is not yet an integrative framework that combines the plethora of findings in humans and nonhuman species and addresses the evolutionary origins of IIC. Here, we take a comparative approach and review findings on the link between IIC and pair-bond formation, maintenance, and bi-parental care. Our review suggests that across socially monogamous species, IIC, at both the behavioral and physiological level, is correlated with the likelihood of forming and retaining a pair-bond, and with reproductive success. We expand on the pair-bonding hypothesis by proposing that higher levels of IIC might be beneficial for relationship quality and bi-parental care and, as a result, might also become a preferred trait in the formation and maintenance of a pair-bond. We further discuss the key questions for disentangling the evolution of IIC based on this hypothesis.

    My Fear Is Not, and Never Will Be, Your Fear: On Emotions and Feelings in Animals

    Do nonhuman animals (henceforth, animals) have emotions, and if so, are these similar to ours? This opinion piece aims to add to the recent debate about this question and provides a critical re-evaluation of what can be concluded about animal and human emotions. Emotions, and their cognitive interpretation, i.e., feelings, serve important survival functions. Emotions, we believe, can exist without feelings and unconsciously influence our behavior more than we think, possibly more so than feelings do. Given that emotions are expressed in body and brain, they can be inferred from these measures. We view feelings primarily as private states, which may be similar across closely related species but remain mostly inaccessible to science. Still, combining data acquired through behavioral observation with data obtained from noninvasive techniques (e.g., eye-tracking, thermography, hormonal samples) and from cognitive tasks (e.g., decision-making paradigms, cognitive bias, attentional bias) provides new information about the inner states of animals, and possibly about their feelings as well. Given that many other species show behavioral, neurophysiological, hormonal, and cognitive responses to valenced stimuli equivalent to human responses, it seems logical to speak of animal emotions and sometimes even of animal feelings. At the very least, the contemporary multi-method approach allows us to get closer than ever before. We conclude with recommendations on how the field should move forward.

    A sensorimotor control framework for understanding emotional communication and regulation

    JHGW and CFH are supported by the Northwood Trust. TEVR was supported by a National Health and Medical Research Council (NHMRC) Early Career Fellowship (1088785). RP and MW were supported by the Australian Research Council (ARC) Centre of Excellence for Cognition and its Disorders (CE110001021).

    Social context influences recognition of bodily expressions

    Previous studies have shown that recognition of facial expressions is influenced by the affective information provided by the surrounding scene. The goal of this study was to investigate whether similar effects could be obtained for bodily expressions. Images of emotional body postures were briefly presented as part of social scenes showing either neutral or emotional group actions. In Experiment 1, fearful and happy bodies were presented in fearful, happy, neutral, and scrambled contexts. In Experiment 2, we compared happy with angry body expressions. In Experiments 3 and 4, we blurred the facial expressions of all people in the scene. This allowed us to ascribe possible scene effects to the body expressions visible in the scene and to measure the contribution of facial expressions to body-expression recognition. In all experiments, we observed an effect of social scene context: bodily expressions were better recognized when the actions in the scene expressed an emotion congruent with the bodily expression of the target figure. The specific influence of facial expressions in the scene depended on the emotional expression but did not necessarily increase the congruency effect. Taken together, the results show that social context influences our recognition of a person’s bodily expression.

    RCEA: Real-time, Continuous Emotion Annotation for collecting precise mobile video ground truth labels

    Collecting accurate and precise emotion ground-truth labels for mobile video watching is essential for ensuring meaningful predictions. However, existing video-based emotion annotation techniques either rely on post-stimulus discrete self-reports or allow real-time, continuous emotion annotation (RCEA) only in desktop settings. Following a user-centric approach, we designed an RCEA technique for mobile video watching and validated its usability and reliability in a controlled indoor study (N=12) and a later outdoor study (N=20). Drawing on physiological measures, interaction logs, and subjective workload reports, we show that (1) RCEA is perceived as usable for annotating emotions while watching mobile videos, without increasing users' mental workload, and (2) the resulting time-variant annotations are comparable with the intended emotion attributes of the video stimuli (classification error for valence: 8.3%; arousal: 25%). We contribute a validated annotation technique and an associated annotation-fusion method suitable for collecting fine-grained emotion annotations while users watch mobile videos.
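
    A minimal sketch of one way continuous annotations could be fused and compared against an intended stimulus label (resampling to a common time base and averaging across annotators is an illustrative approach, not necessarily the paper's fusion method):

```python
import numpy as np

def fuse_annotations(traces: list, n_samples: int = 100) -> np.ndarray:
    """Resample each annotator's continuous valence (or arousal) trace to a
    common time base, then average across annotators."""
    grid = np.linspace(0.0, 1.0, n_samples)
    resampled = [np.interp(grid, np.linspace(0.0, 1.0, len(t)), t) for t in traces]
    return np.mean(resampled, axis=0)

# Illustrative valence traces in [-1, 1] from three annotators of one video.
rng = np.random.default_rng(1)
traces = [np.clip(np.cumsum(rng.normal(0, 0.05, n)), -1, 1) for n in (80, 120, 95)]
fused = fuse_annotations(traces)

# Classification error against an intended valence label (+1 = positive):
intended = 1
error = float(np.mean(np.sign(fused) != intended))
print(f"valence classification error: {error:.1%}")
```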